
Linear Correlation and Regression in R

2017-11-16 江利冰 临床科研与meta分析
> x1 <- c(9.9, 11.2, 9.4, 8.4, 14.8, 12.4, 13.1, 13.4, 11.2, 9.5, 10.7, 9.2)
> x2 <- c(7.9, 8.9, 8.5, 9.4, 12, 11.5, 14.5, 12.3, 9.2, 11, 8.3, 8.5)
> plot(x2 ~ x1, pch = 16)
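A quick aside before the formal test: `cor()` alone returns the point estimate of the correlation, and a rank-based Spearman test is a common alternative when the data look clearly non-normal. A minimal sketch (the Spearman call is an extra suggestion, not part of the original example):

> cor(x1, x2)                            ## point estimate only; matches the cor.test() result below
[1] 0.7204761
> cor.test(x1, x2, method = "spearman")  ## rank-based alternative if normality is doubtful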


> cor.test(x1, x2, method = "pearson")

        Pearson's product-moment correlation

data:  x1 and x2
t = 3.2854, df = 10, p-value = 0.008214
alternative hypothesis: true correlation is not equal to 0
95 percent confidence interval:
 0.2499064 0.9157367
sample estimates:
      cor 
0.7204761 

> model <- lm(x2 ~ x1)   ## fit the linear regression
> abline(model)          ## add the fitted line to the scatter plot
> summary(model)

Call:
lm(formula = x2 ~ x1)

Residuals:
    Min      1Q  Median      3Q     Max 
-1.5658 -1.1169 -0.3129  0.6186  2.8291 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)   
(Intercept)   1.8182     2.5775   0.705  0.49664   
x1            0.7521     0.2289   3.285  0.00821 **
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 1.495 on 10 degrees of freedom
Multiple R-squared:  0.5191,    Adjusted R-squared:  0.471 
F-statistic: 10.79 on 1 and 10 DF,  p-value: 0.008214

> res <- residuals(model)   ## extract the residuals
> shapiro.test(x1)

        Shapiro-Wilk normality test

data:  x1
W = 0.94989, p-value = 0.6354

> shapiro.test(res)

        Shapiro-Wilk normality test

data:  res
W = 0.90737, p-value = 0.1974
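Besides the Shapiro-Wilk test on the residuals, base R offers standard regression diagnostics and coefficient confidence intervals. A minimal sketch using built-in functions (not shown in the original post):

> confint(model)          ## 95% confidence intervals for the intercept and slope
> par(mfrow = c(2, 2))    ## 2 x 2 layout for the four diagnostic plots
> plot(model)             ## residuals vs fitted, normal Q-Q, scale-location, residuals vs leverage
> par(mfrow = c(1, 1))    ## restore the default layout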


Let's look at one more example!


> x <- c(1, 3, 5, 7, 9)
> y <- c(8.03, 14.97, 19.23, 27.83, 36.23)
> model <- lm(y ~ x)
> summary(model)

Call:
lm(formula = y ~ x)

Residuals:
     1      2      3      4      5 
 0.624  0.638 -2.028 -0.354  1.120 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)    
(Intercept)   3.9430     1.3151   2.998 0.057748 .  
x             3.4630     0.2289  15.127 0.000627 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 1.448 on 3 degrees of freedom
Multiple R-squared:  0.9871,    Adjusted R-squared:  0.9827 
F-statistic: 228.8 on 1 and 3 DF,  p-value: 0.0006272

> plot(y ~ x, pch = 16, xaxt = "n", yaxt = "n", bty = "n", xlim = c(0, 10), ylim = c(0, 40))
> axis(side = 1, at = seq(0, 10, 2))   ## set the x-axis ticks
> axis(side = 2, at = seq(0, 40, 5))   ## set the y-axis ticks
> abline(model)                        ## add the regression line
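Once the model is fitted, `predict()` can be used to estimate y at new x values, with confidence or prediction intervals. A minimal sketch; the new x values here are made up for illustration:

> new_x <- data.frame(x = c(2, 4, 6))                        ## hypothetical new observations
> predict(model, newdata = new_x, interval = "confidence")   ## interval for the mean response
> predict(model, newdata = new_x, interval = "prediction")   ## wider interval for individual observations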


